Supplementary material for “Covariance-enhanced discriminant analysis”
Abstract
Proof of Theorem 1. The proof is summarized in the following three steps. First, we prove that $Q_n(\omega^*, \mu^*, \Omega^*) \geq Q_n(\omega, \mu^*, \Omega^*)$ for $\|\omega_{(1)} - \omega^*_{(1)}\|_2 = O_p(n^{-1/2})$. In Step 2, we show that $Q_n(\hat{\omega}, \mu^*, \Omega^*) \geq Q_n(\hat{\omega}, \mu^*, \Omega)$ for $\|\Omega - \Omega^*\|_F = O_p[\{(p_n + a_n)\log p_n/n\}^{1/2}]$. In Step 3, we prove that $Q_n(\hat{\omega}, \mu^*, \hat{\Omega}) \geq Q_n(\hat{\omega}, \mu, \hat{\Omega})$ for $\|\mu - \mu^*\|_2 = O_p\{(p_n\log p_n/n)^{1/2}\}$. The following are the details.

Step 1. Let $\Delta\omega_{(1)} = \omega_{(1)} - \omega^*_{(1)}$ and $h(\omega_{(1)}) = \sum_{i=1}^{n}\sum_{k=1}^{K}\tau_{ik}\log\omega_k$, where $\omega_K = 1 - \sum_{k=1}^{K-1}\omega_k$. We denote by $J_\omega = (\delta_1, \ldots, \delta_K)^T$ the Jacobian matrix, where $\delta_k$ $(1 \leq k < K)$ is the $(K-1)$-dimensional unit vector whose $k$th component is 1, and $\delta_K$ is the $(K-1)$-dimensional vector of ones. An application of a Taylor expansion yields
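The excerpt breaks off at this expansion. For reference, a second-order Taylor expansion of $h(\omega_{(1)})$ around $\omega^*_{(1)}$, computed directly from the definition of $h$ above, takes the form sketched below; the mean-value point $\bar{\omega}_{(1)}$, lying on the segment between $\omega_{(1)}$ and $\omega^*_{(1)}$, is introduced here only for the remainder term and is not part of the original text.

% Second-order expansion of h around omega*_(1); the gradient and Hessian follow
% from h(omega_(1)) = sum_i sum_k tau_{ik} log omega_k with omega_K = 1 - sum_{k<K} omega_k.
\begin{align*}
h(\omega_{(1)}) &= h(\omega^*_{(1)}) + \Delta\omega_{(1)}^{T}\,\nabla h(\omega^*_{(1)})
  + \tfrac{1}{2}\,\Delta\omega_{(1)}^{T}\,\nabla^{2}h(\bar{\omega}_{(1)})\,\Delta\omega_{(1)},\\
\frac{\partial h(\omega_{(1)})}{\partial\omega_k}
  &= \sum_{i=1}^{n}\Bigl(\frac{\tau_{ik}}{\omega_k}-\frac{\tau_{iK}}{\omega_K}\Bigr),
  \qquad
\frac{\partial^{2}h(\omega_{(1)})}{\partial\omega_k\,\partial\omega_l}
  = -\sum_{i=1}^{n}\Bigl(\frac{\tau_{ik}}{\omega_k^{2}}\,1\{k=l\}
  + \frac{\tau_{iK}}{\omega_K^{2}}\Bigr), \quad 1\le k,l\le K-1.
\end{align*}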
Similar articles
Understanding and Evaluating Sparse Linear Discriminant Analysis: Supplementary File
To begin, we first use B to project all the training data into a low-dimensional discriminant space. Let Σ_w and μ_k denote the within-class pooled covariance matrix and the mean of class k, respectively, for all the projected data. When given a new observation x, we assume that it is drawn from a normal distribution with mean given by some μ_k and covariance given by Σ_w^B. We then assign x to th...
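The truncated passage above amounts to Gaussian classification in the space spanned by B. As a minimal sketch of the resulting assignment rule under that model, with the class priors $\pi_k$, the number of classes $K$, and the notation $B^{T}x$ for the projection being assumptions added here rather than taken from the excerpt:

% Assign x to the class with the largest Gaussian log-posterior in the projected space;
% the term -(1/2) log|Sigma_w^B| is common to all classes and therefore omitted.
\begin{equation*}
\hat{k}(x) = \arg\max_{1\le k\le K}\Bigl\{\log\pi_k
  - \tfrac{1}{2}\,\bigl(B^{T}x-\mu_k\bigr)^{T}\bigl(\Sigma_w^{B}\bigr)^{-1}\bigl(B^{T}x-\mu_k\bigr)\Bigr\},
\end{equation*}

where $\mu_k$ and $\Sigma_w^{B}$ are the projected class mean and pooled within-class covariance described in the excerpt. With equal priors the $\log\pi_k$ term drops out, and the rule reduces to assigning $x$ to the nearest class mean in the metric defined by $\bigl(\Sigma_w^{B}\bigr)^{-1}$.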
A New Feature Extraction Method Based on the Information Fusion of Entropy Matrix and Covariance Matrix and Its Application in Face Recognition
The classic principal components analysis (PCA), kernel PCA (KPCA) and linear discriminant analysis (LDA) feature extraction methods evaluate the importance of components according to their covariance contribution, not considering the entropy contribution, which is important supplementary information for the covariance. To further improve the covariance-based methods such as PCA (or KPCA), this...
Covariance-enhanced discriminant analysis.
Linear discriminant analysis has been widely used to characterize or separate multiple classes via linear combinations of features. However, the high dimensionality of features from modern biological experiments defies traditional discriminant analysis techniques. Possible interfeature correlations present additional challenges and are often underused in modelling. In this paper, by incorporati...
Between-Class Covariance Correction For Linear Discriminant Analysis in Language Recognition
Linear Discriminant Analysis (LDA) is one of the most widely-used channel compensation techniques in current speaker and language recognition systems. In this study, we propose a technique of Between-Class Covariance Correction (BCC) to improve language recognition performance. This approach builds on the idea of Within-Class Covariance Correction (WCC), which was introduced as a means to compen...